
    Aptamer-based multiplexed proteomic technology for biomarker discovery

    Interrogation of the human proteome in a highly multiplexed and efficient manner remains a coveted and challenging goal in biology. We present a new aptamer-based proteomic technology for biomarker discovery capable of simultaneously measuring thousands of proteins from small sample volumes (15 µL of serum or plasma). Our current assay allows us to measure ~800 proteins with very low limits of detection (1 pM average), 7 logs of overall dynamic range, and 5% average coefficient of variation. This technology is enabled by a new generation of aptamers that contain chemically modified nucleotides, which greatly expand the physicochemical diversity of the large randomized nucleic acid libraries from which the aptamers are selected. Proteins in complex matrices such as plasma are measured with a process that transforms a signature of protein concentrations into a corresponding DNA aptamer concentration signature, which is then quantified with a DNA microarray. In essence, our assay takes advantage of the dual nature of aptamers as both folded binding entities with defined shapes and unique sequences recognizable by specific hybridization probes. To demonstrate the utility of our proteomics biomarker discovery technology, we applied it to a clinical study of chronic kidney disease (CKD). We identified two well-known CKD biomarkers as well as an additional 58 potential CKD biomarkers. These results demonstrate the potential utility of our technology to discover unique protein signatures characteristic of various disease states. More generally, we describe a versatile and powerful tool that allows large-scale comparison of proteome profiles among discrete populations. This unbiased and highly multiplexed search engine will enable the discovery of novel biomarkers in a manner that is unencumbered by our incomplete knowledge of biology, thereby helping to advance the next generation of evidence-based medicine.
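
    The performance figures quoted above (1 pM average limit of detection, 7 logs of dynamic range, 5% average coefficient of variation) are summary statistics across many analytes. The following is a minimal, hypothetical sketch of how such metrics are typically computed from replicate measurements; the numbers and variable names are illustrative and are not taken from the paper.

    ```python
    # Hypothetical illustration of the summary statistics quoted in the abstract
    # (coefficient of variation, dynamic range in logs). Values are invented.
    import numpy as np

    # Replicate measurements (arbitrary units) for one analyte across assay runs.
    replicates = np.array([102.0, 98.5, 101.2, 99.8, 100.5])

    # Coefficient of variation: standard deviation relative to the mean,
    # reported as a percentage (the abstract cites a 5% average CV).
    cv_percent = 100.0 * replicates.std(ddof=1) / replicates.mean()

    # Dynamic range in "logs": orders of magnitude between the lowest and
    # highest reliably quantifiable concentrations (the abstract cites 7 logs,
    # with a ~1 pM average limit of detection at the low end).
    lower_limit_pM = 1.0       # hypothetical lower limit of quantification
    upper_limit_pM = 1.0e7     # hypothetical upper limit of quantification
    dynamic_range_logs = np.log10(upper_limit_pM / lower_limit_pM)

    print(f"CV: {cv_percent:.1f}%")
    print(f"Dynamic range: {dynamic_range_logs:.0f} logs")
    ```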

    Evaluation of operational ocean forecasting systems from the perspective of the users and the experts

    The Intergovernmental Oceanographic Commission (IOC) has an Ocean Decade Implementation Plan (UNESCO-IOC, 2021) that states seven outcomes required for “the ocean we want,” with the fourth outcome being “A predicted ocean where society understands and can respond to changing ocean conditions.” To facilitate the achievement of this goal, the IOC has endorsed Mercator Ocean International to implement the Decade Collaborative Center (DCC) for OceanPrediction (https://www.mercator-ocean.eu/oceanprediction/, last access: 21 August 2023), which is a cross-cutting structure that will work to develop global-scale collaboration between Decade Actions related to ocean prediction.

    Factors influencing the educational and occupational choices of women

    The purpose of this study is to examine self-efficacy in mathematics and various motivating factors among female college students who chose college majors in traditionally female-dominated fields compared to those who chose college majors in traditionally male-dominated fields. The Mathematics Self-Efficacy Scale (MSES) and an adapted version of the College Survey were administered to forty-six female college students. Differences between groups in the outcomes of the surveys were measured using a one-way ANOVA for the MSES and a chi-square and gamma test for individual items from the College Survey. The MSES results indicated differences between the groups in the expected direction, but these differences were not statistically significant. There were statistically significant differences between the groups on several items from the adapted College Survey, indicating differences in motivational factors contributing to choice of college major.
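
    As a minimal, hypothetical sketch of the analyses described (a one-way ANOVA on MSES scores, and a chi-square test with a Goodman-Kruskal gamma on an ordered survey item), the example below uses invented data rather than the study's own.

    ```python
    # Hypothetical sketch of the analyses named in the abstract. All values
    # below are invented for illustration only.
    import numpy as np
    from scipy.stats import f_oneway, chi2_contingency

    # MSES scores for the two groups of majors (hypothetical values).
    female_dominated_majors = np.array([112, 105, 118, 109, 120, 101])
    male_dominated_majors = np.array([125, 119, 130, 122, 127, 133])

    f_stat, anova_p = f_oneway(female_dominated_majors, male_dominated_majors)

    # Contingency table for one survey item: rows = group, columns = ordered
    # response categories (e.g., low / medium / high importance).
    table = np.array([[8, 10, 5],
                      [3, 7, 13]])
    chi2, chi2_p, dof, _ = chi2_contingency(table)

    def goodman_kruskal_gamma(table):
        """Gamma = (concordant - discordant) / (concordant + discordant)."""
        table = np.asarray(table)
        concordant = discordant = 0
        rows, cols = table.shape
        for i in range(rows):
            for j in range(cols):
                concordant += table[i, j] * table[i + 1:, j + 1:].sum()
                discordant += table[i, j] * table[i + 1:, :j].sum()
        return (concordant - discordant) / (concordant + discordant)

    print(f"ANOVA: F = {f_stat:.2f}, p = {anova_p:.3f}")
    print(f"Chi-square: chi2 = {chi2:.2f}, p = {chi2_p:.3f}, "
          f"gamma = {goodman_kruskal_gamma(table):.2f}")
    ```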

    Challenges and Opportunities with Big Data 2011-1

    The promise of data-driven decision-making is now being recognized broadly, and there is growing enthusiasm for the notion of “Big Data.” While the promise of Big Data is real -- for example, it is estimated that Google alone contributed 54 billion dollars to the US economy in 2009 -- there is currently a wide gap between its potential and its realization. Heterogeneity, scale, timeliness, complexity, and privacy problems with Big Data impede progress at all phases of the pipeline that can create value from data. The problems start right away during data acquisition, when the data tsunami requires us to make decisions, currently in an ad hoc manner, about what data to keep and what to discard, and how to store what we keep reliably with the right metadata. Much data today is not natively in structured format; for example, tweets and blogs are weakly structured pieces of text, while images and video are structured for storage and display, but not for semantic content and search: transforming such content into a structured format for later analysis is a major challenge. The value of data explodes when it can be linked with other data; thus data integration is a major creator of value. Since most data is directly generated in digital format today, we have both the opportunity and the challenge to influence the creation to facilitate later linkage and to automatically link previously created data. Data analysis, organization, retrieval, and modeling are other foundational challenges. Data analysis is a clear bottleneck in many applications, both due to lack of scalability of the underlying algorithms and due to the complexity of the data that needs to be analyzed. Finally, presentation of the results and their interpretation by non-technical domain experts is crucial to extracting actionable knowledge. During the last 35 years, data management principles such as physical and logical independence, declarative querying, and cost-based optimization have led to a multi-billion dollar industry. More importantly, these technical advances have enabled the first round of business intelligence applications and laid the foundation for managing and analyzing Big Data today. The many novel challenges and opportunities associated with Big Data necessitate rethinking many aspects of these data management platforms, while retaining other desirable aspects. We believe that appropriate investment in Big Data will lead to a new wave of fundamental technological advances that will be embodied in the next generations of Big Data management and analysis platforms, products, and systems. We believe that these research problems are not only timely, but also have the potential to create huge economic value in the US economy for years to come. However, they are also hard, requiring us to rethink data analysis systems in fundamental ways. A major investment in Big Data, properly directed, can result not only in major scientific advances, but also lay the foundation for the next generation of advances in science, medicine, and business.
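
    As a deliberately simplified, hypothetical illustration of the extraction step mentioned above (turning weakly structured text such as a tweet into a structured record for later analysis), the sketch below uses made-up field names and patterns rather than any method from the report.

    ```python
    # Hypothetical sketch: extract a structured record from weakly structured
    # text (e.g., a tweet) so it can be stored and queried later.
    import re
    from datetime import datetime, timezone

    def extract_record(raw_text: str) -> dict:
        """Pull hashtags, mentions, and URLs out of free text into named fields."""
        return {
            "text": raw_text,
            "hashtags": re.findall(r"#(\w+)", raw_text),
            "mentions": re.findall(r"@(\w+)", raw_text),
            "urls": re.findall(r"https?://\S+", raw_text),
            "ingested_at": datetime.now(timezone.utc).isoformat(),
        }

    record = extract_record(
        "Big Data panel with @jdoe was great! Slides: https://example.org/slides #bigdata"
    )
    print(record)
    ```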

    Annual Selected Bibliography
